
How do I activate both the Intel graphics GPU and the Nvidia GeForce at the same time?

You don't "activate" them, but you can utilize both depending on the software application you're trying to utilize them with.


Example: LuxRender has an OpenCL GPU render option that will use every available OpenCL device if you tell it to.
The on-chip graphics are used as just another OpenCL device.
When benchmarking a few different render engines, comparing i7s to comparable Xeons, we found a slight performance benefit from the i7 because we could use its integrated graphics for OpenCL compute tasks.

BUT the improvement, while measurable in real-world terms, was of limited practical value.
Yes, it reduced render times by 4-6%, and yes, when that means shaving 36 minutes off of a 12-hour render, that is a real-world benefit...
But if your compute task completes in 12 minutes instead of 12 hours, you're only looking at a saving of about 36 seconds...
which is far less worthwhile.

So, you can put that otherwise idle on-chip graphics processor to work if the software you're using is written to do so...
but it may or may not be worthwhile.
For applications that would gain little benefit, it doesn't make much sense for developers to write their software that way.
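
To make "written to do so" concrete, here is a rough sketch, assuming the OpenCL headers and ICD loader are installed, of how a compute application enumerates every GPU device the drivers expose; on a machine with both integrated and discrete graphics it will typically print both (the file name and build command are just illustrative, not from the original answer):

    // list_gpus.cpp -- print every OpenCL GPU device the installed drivers expose.
    // On a system with integrated + discrete graphics this typically lists both,
    // e.g. the Intel HD Graphics and the NVIDIA GeForce.
    // Build (assuming OpenCL headers and ICD loader are installed):
    //   g++ list_gpus.cpp -lOpenCL -o list_gpus
    #include <CL/cl.h>
    #include <cstdio>
    #include <vector>

    int main() {
        cl_uint num_platforms = 0;
        clGetPlatformIDs(0, nullptr, &num_platforms);
        std::vector<cl_platform_id> platforms(num_platforms);
        clGetPlatformIDs(num_platforms, platforms.data(), nullptr);

        for (cl_platform_id platform : platforms) {
            cl_uint num_devices = 0;
            if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 0, nullptr, &num_devices) != CL_SUCCESS)
                continue;  // this platform has no GPU devices

            std::vector<cl_device_id> devices(num_devices);
            clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, num_devices, devices.data(), nullptr);

            for (cl_device_id device : devices) {
                char name[256] = {0};
                clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, nullptr);
                std::printf("GPU compute device: %s\n", name);
            }
        }
        return 0;
    }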

Both cards can remain active at the same time, but not for the same application.
The integrated graphics chipset is the one responsible for switching on the dedicated graphics card when an application, such as a game, requires it, and for switching it off again when you exit that application.
The integrated one always stays on, even though it doesn't run the application being handled by the dedicated one.
However, it's possible that the integrated card is running some other process or application at the same time.


Using a custom driver might let you use both cards for the same application, but that would only reduce your overall performance, since the difference between the two GPUs is too large.
In such a case, the performance of the dedicated card would be dragged down to match the integrated one.
In fact, this is what happens with AMD Dual Graphics in laptops, in which two GPUs (one integrated and one dedicated) with similar performance work together to run an application (usually through alternate frame rendering).
However, that process doesn't work well with all games, as most of the time the dedicated card performs better on its own.

Switchable graphics laptops normally disable one of the cards, and that card cannot then be used for display or computation.
I am not aware of any switching software that activates both cards at the same time.
However, if you use Linux and do not install any switching software, both cards may be active at the same time.
You can then configure xorg to use the integrated graphics card and use the discrete card for computation.
That is the case on my relatively old ATI switchable graphics laptop, where I have to install some additional modules to disable one of the cards for power saving.
Otherwise both cards consume power even though one of them isn't used for anything.
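
As a rough sketch of that xorg side (the driver name and BusID below are placeholders, not values from the answer above; check the real ones with lspci), you point the X server at the integrated device only, which leaves the discrete card free for compute:

    Section "Device"
        Identifier "IntegratedGPU"
        Driver     "intel"        # or "modesetting"/"radeon", depending on the integrated chip
        BusID      "PCI:0:2:0"    # placeholder -- find the real bus ID with lspci
    EndSection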


A similar situation may occur on Windows if you don't install the switching software.
However, I am not sure how to select the card used for display on Windows.

Also keep in mind that even if the discrete card is chosen for display, it can still be used for computation.

You don't say what platform you are using, but I presume that you have on-board Intel graphics and an add-on NVIDIA card.

Generally, there is a BIOS setting that determines whether the on-board (Intel) graphics, the add-in (NVIDIA) card, or both are used.
How to change that setting varies by platform, and some platforms offer less granular configuration than others.

I have observed that the default for most systems is that the add-in video card overrides and disables the on-board video.



How to activate the graphics card on a laptop

Laptops with Intel HD Graphics and NVIDIA cards use something called "Optimus", which causes the laptop to use only the Intel HD Graphics on the Windows desktop and only the NVIDIA card in games.

This saves you a LOT of battery.

This is why dxdiag will not detect your graphics card--it's not active within Windows.

Don't worry--you don't need your NVIDIA card on the Windows desktop.

By default, most games will use the NVIDIA card, but you can make sure of that in the NVIDIA Control Panel.
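
For completeness, the per-application choice the control panel makes is something a program can also request for itself. A minimal sketch of the documented NVIDIA Optimus hint, which is Windows/MSVC-specific (the AMD symbol mentioned in the comment is the equivalent for AMD switchable graphics):

    // Exporting this symbol from an application's .exe asks the Optimus driver to
    // run the process on the discrete NVIDIA GPU instead of the integrated Intel one.
    // (The AMD equivalent is AmdPowerXpressRequestHighPerformance.)
    extern "C" {
        __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
    }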